Search results for "Reproducing kernel Hilbert space"

Showing 10 of 11 documents

Bergman and Bloch spaces of vector-valued functions

2003

We investigate Bergman and Bloch spaces of analytic vector-valued functions in the unit disc. We show how the Bergman projection from the Bochner–Lebesgue space Lp(𝔻, X) onto the Bergman space Bp(X) extends boundedly to the space of vector-valued measures of bounded p-variation Vp(X), and we use this fact to prove that the dual of Bp(X) is Bp′(X*) for any complex Banach space X and 1 < p < ∞; for p = 1 the dual is the Bloch space ℬ(X*). Furthermore, we relate these spaces (via the Bergman kernel) to the classes of p-summing and positive p-summing operators, and we show in the same framework that Bp(X) is always complemented in ℓp(X). (© 2003 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)

Keywords: Bloch space; Pure mathematics; Bergman space; General Mathematics; Bounded function; Mathematical analysis; Banach space; Interpolation space; Space (mathematics); Bergman kernel; Reproducing kernel Hilbert space; Mathematics
Source: Mathematische Nachrichten
researchProduct

Support vector machines for nonlinear kernel ARMA system identification

2006

Nonlinear system identification based on support vector machines (SVM) has usually been addressed by means of standard SVM regression (SVR), which can be seen as an implicit nonlinear autoregressive and moving average (ARMA) model in some reproducing kernel Hilbert space (RKHS). The proposal of this letter is twofold. First, the explicit consideration of an ARMA model in an RKHS (SVM-ARMA2K) is proposed. We show that stating the ARMA equations in an RKHS leads to solving the regularized normal equations in that RKHS, in terms of the autocorrelation and cross-correlation of the (nonlinearly) transformed input and output discrete-time processes. Second, a general class of SVM-based syste…
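
The "regularized normal equations in the RKHS" mentioned above can be sketched as a kernel ridge solve over lagged regressors. The toy plant, the RBF kernel, and all lag orders and parameters below are illustrative assumptions, not the letter's SVM-ARMA2K formulation (which uses separate kernels for input and output processes):

```python
import numpy as np

def rbf(A, B, gamma=0.5):
    """Gaussian kernel matrix between the rows of A and B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def arma_regressors(u, y, p=2, q=2):
    """Stack past outputs y[n-1..n-p] and inputs u[n-1..n-q] as z[n]."""
    N = len(y)
    start = max(p, q)
    Z = np.column_stack(
        [y[start - i:N - i] for i in range(1, p + 1)]
        + [u[start - j:N - j] for j in range(1, q + 1)]
    )
    return Z, y[start:]

# Toy nonlinear ARMA plant (made up for illustration).
rng = np.random.default_rng(1)
u = rng.normal(size=400)
y = np.zeros(400)
for n in range(2, 400):
    y[n] = 0.5 * np.tanh(y[n - 1]) - 0.3 * y[n - 2] + 0.8 * u[n - 1]

# Regularized normal equations in the RKHS: solve (K + c I) a = y.
Z, target = arma_regressors(u, y)
a = np.linalg.solve(rbf(Z, Z) + 1e-2 * np.eye(len(Z)), target)
pred = rbf(Z, Z) @ a
```

Replacing the squared loss here with the ε-insensitive loss would recover the implicit SVR-ARMA view the abstract contrasts with.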

Keywords: Computer Science::Machine Learning; Statistics::Theory; Computer Networks and Communications; Biomedical signal processing; Information Storage and Retrieval; Machine learning; Pattern Recognition, Automated; Statistics::Machine Learning; Artificial Intelligence; Applied mathematics; Statistics::Methodology; Autoregressive–moving-average model; Computer Simulation; Mathematics; Telecommunications; Support vector machines; Models, Statistical; Nonlinear system identification; Autocorrelation; System identification; Signal Processing, Computer-Assisted; General Medicine; Computer Science Applications; Support vector machine; Nonlinear system; Kernel; Autoregressive model; Nonlinear Dynamics; ARMA modelling; 3325 Telecommunications Technology; Artificial intelligence; Neural Networks, Computer; Software; Algorithms; Reproducing kernel Hilbert space
Source: IEEE Transactions on Neural Networks

Optimizing Kernel Ridge Regression for Remote Sensing Problems

2018

Kernel methods have been very successful in remote sensing problems because of their ability to deal with high-dimensional nonlinear data. However, they are computationally expensive to train when a large number of samples is used. In this context, while the amount of available remote sensing data has constantly increased, the size of training sets in kernel methods is usually restricted to a few thousand samples. In this work, we modify the kernel ridge regression (KRR) training procedure to deal with large-scale datasets. In addition, the basis functions in the reproducing kernel Hilbert space are defined as parameters to be optimized during the training process as well. This extends the n…
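
The standard KRR training procedure the abstract starts from reduces to one regularized linear solve in the dual. A minimal sketch, assuming an RBF kernel and made-up toy data (the paper's actual contribution, optimizing the basis functions for large-scale data, is not shown here):

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def krr_fit(X, y, lam=1e-2, gamma=1.0):
    """Dual weights alpha solving (K + lam * I) alpha = y."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def krr_predict(X_train, alpha, X_new, gamma=1.0):
    """Predictions f(x) = sum_i alpha_i k(x, x_i)."""
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# Toy usage: fit a noisy sine wave.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
alpha = krr_fit(X, y, lam=1e-2, gamma=0.5)
pred = krr_predict(X, alpha, X, gamma=0.5)
```

The n x n solve is exactly the cubic cost that makes plain KRR impractical beyond a few thousand samples, which motivates the modified training procedure described above.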

Keywords: Computer science; 0211 other engineering and technologies; Hyperspectral imaging; Context (language use); Basis function; 02 engineering and technology; 01 natural sciences; Data set; 010104 statistics & probability; Kernel (linear algebra); Kernel method; Kernel (statistics); Radial basis function kernel; 0101 mathematics; 021101 geological & geomatics engineering; Reproducing kernel Hilbert space; Remote sensing
Source: IGARSS 2018 - 2018 IEEE International Geoscience and Remote Sensing Symposium

A Unified SVM Framework for Signal Estimation

2013

This paper presents a unified framework to tackle estimation problems in Digital Signal Processing (DSP) using Support Vector Machines (SVMs). The use of SVMs in estimation problems has traditionally been limited to their mere use as a black-box model. Noting such limitations in the literature, we take advantage of several properties of Mercer's kernels and functional analysis to develop a family of SVM methods for estimation in DSP. Three types of signal model equations are analyzed. First, when a specific time-signal structure is assumed to model the underlying system that generated the data, the linear signal model (the so-called Primal Signal Model formulation) is stated and analyzed. T…

Keywords: FOS: Computer and information sciences; Noise (signal processing); Computer science; Applied Mathematics; Spectral density estimation; Array processing; Pattern recognition; Machine Learning (stat.ML); Statistics - Applications; Support vector machine; Kernel (linear algebra); Kernel method; Computational Theory and Mathematics; Statistics - Machine Learning; Artificial Intelligence; Signal Processing; Applications (stat.AP); Computer Vision and Pattern Recognition; Artificial intelligence; Electrical and Electronic Engineering; Statistics, Probability and Uncertainty; Digital signal processing; Reproducing kernel Hilbert space

Explicit signal to noise ratio in reproducing kernel Hilbert spaces

2011

This paper introduces a nonlinear feature extraction method based on kernels for remote sensing data analysis. The proposed approach is based on the minimum noise fraction (MNF) transform, which maximizes the signal variance while also minimizing the estimated noise variance. We here propose an alternative kernel MNF (KMNF) in which the noise is explicitly estimated in the reproducing kernel Hilbert space. This enables KMNF to deal with nonlinear relations between the noise and the signal features jointly. Results show that the proposed KMNF provides the most noise-free features when compared with PCA, MNF, KPCA, and the previous version of KMNF. Extracted features with the explicit KMNF…

Keywords: Kernel method; Signal-to-noise ratio; Noise (signal processing); Covariance matrix; Kernel (statistics); Feature extraction; Pattern recognition; Artificial intelligence; Kernel principal component analysis; Mathematics; Reproducing kernel Hilbert space
Source: 2011 IEEE International Geoscience and Remote Sensing Symposium

Explicit Recursive and Adaptive Filtering in Reproducing Kernel Hilbert Spaces

2014

This brief presents a methodology to develop recursive filters in reproducing kernel Hilbert spaces. Unlike previous approaches that exploit the kernel trick on filtered and then mapped samples, we explicitly define the model recursivity in the Hilbert space. For that, we exploit some properties of functional analysis and recursive computation of dot products without the need for preimaging or a training dataset. We illustrate the feasibility of the methodology in the particular case of the $\gamma$-filter, which is an infinite impulse response filter with controlled stability and memory depth. Different algorithmic formulations emerge from the signal model. Experiments in chaotic and elect…
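
For reference, the classical (non-kernelized) gamma filter the brief builds on feeds the input through a cascade of leaky integrators and combines the taps linearly. A minimal sketch with made-up data and parameters; the brief's contribution, carrying this recursion into the RKHS, is not shown:

```python
import numpy as np

def gamma_taps(u, K=4, mu=0.6):
    """Gamma memory: x_0[n] = u[n],
    x_k[n] = (1 - mu) * x_k[n-1] + mu * x_{k-1}[n-1],  k = 1..K."""
    N = len(u)
    X = np.zeros((N, K + 1))
    X[:, 0] = u
    for n in range(1, N):
        for k in range(1, K + 1):
            X[n, k] = (1 - mu) * X[n - 1, k] + mu * X[n - 1, k - 1]
    return X

# Linear gamma filter fitted by least squares to approximate a 2-sample delay.
rng = np.random.default_rng(2)
u = rng.normal(size=500)
X = gamma_taps(u, K=4, mu=0.6)
d = np.roll(u, 2)
d[:2] = 0.0                       # zero initial conditions for the target
w, *_ = np.linalg.lstsq(X, d, rcond=None)
y_hat = X @ w
```

The parameter mu trades memory depth against resolution and controls stability (0 < mu < 2); for mu = 1 the structure degenerates to an ordinary FIR tapped delay line.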

Keywords: Mathematical optimization; Computer Networks and Communications; 02 engineering and technology; autoregressive and moving-average; kernel methods; Artificial Intelligence; 0202 electrical engineering, electronic engineering, information engineering; Kernel adaptive filter; Infinite impulse response; Mathematics; filter; recursive; Hilbert space; 020206 networking & telecommunications; Filter (signal processing); Adaptive; Computer Science Applications; Adaptive filter; Kernel method; Kernel (statistics); 020201 artificial intelligence & image processing; Algorithm; Software; Reproducing kernel Hilbert space
Source: IEEE Transactions on Neural Networks and Learning Systems

Explicit recursivity into reproducing kernel Hilbert spaces

2011

This paper presents a methodology to develop recursive filters in reproducing kernel Hilbert spaces (RKHS). Unlike previous approaches that exploit the kernel trick on filtered and then mapped samples, we explicitly define model recursivity in the Hilbert space. The method exploits some properties of functional analysis and recursive computation of dot products without the need for pre-imaging. We illustrate the feasibility of the methodology in the particular case of the gamma-filter, an infinite impulse response (IIR) filter with controlled stability and memory depth. Different algorithmic formulations emerge from the signal model. Experiments in chaotic and electroencephalographic time se…

Keywords: Mathematical optimization; gamma filter; Hilbert space; Dot product; Filter (signal processing); pre-image; functional analysis; kernel methods; Kernel method; Kernel (statistics); Recursive filter; Infinite impulse response; Algorithm; Mathematics; Reproducing kernel Hilbert space

Signal-to-noise ratio in reproducing kernel Hilbert spaces

2018

This paper introduces the kernel signal-to-noise ratio (kSNR) for different machine learning and signal processing applications. The kSNR seeks to maximize the signal variance while minimizing the estimated noise variance explicitly in a reproducing kernel Hilbert space (RKHS). The kSNR gives rise to considering complex signal-to-noise relations beyond additive noise models, and can be seen as a useful signal-to-noise regularizer for feature extraction and dimensionality reduction. We show that the kSNR generalizes kernel PCA (and other spectral dimensionality reduction methods), least squares SVM, and kernel ridge regression to deal with cases where signal and noise cannot be assumed inde…
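
In its linear (input-space) form, maximizing a signal-to-noise ratio of this kind is a generalized eigenproblem: max_w (w'Sw)/(w'Nw) with signal covariance S and noise covariance N. A sketch on synthetic data, using an MNF-style first-difference noise estimate; the data, the direction a, and all constants are made up, and the kernelized version would replace the covariances with Gram-matrix counterparts:

```python
import numpy as np

# Synthetic data: a smooth rank-1 signal along an unknown direction a,
# plus white noise (all values chosen for illustration only).
rng = np.random.default_rng(3)
a = rng.normal(size=5)
t = np.linspace(0, 8 * np.pi, 600)
X = np.outer(np.sin(t), a) + 0.3 * rng.normal(size=(600, 5))

# SNR maximization: max_w (w'Sw)/(w'Nw), with S the data covariance and
# N a noise-covariance estimate taken from first differences of the data.
S = np.cov(X, rowvar=False)
N = np.cov(np.diff(X, axis=0), rowvar=False)
L = np.linalg.cholesky(N)
Linv = np.linalg.inv(L)
M = Linv @ S @ Linv.T                      # symmetric whitened covariance
evals, V = np.linalg.eigh(M)               # ascending eigenvalues
w = Linv.T @ V[:, -1]                      # direction of maximum SNR
```

Whitening by the Cholesky factor of N turns the generalized problem into an ordinary symmetric eigendecomposition; the top eigenvector, mapped back, recovers the high-SNR signal direction.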

Keywords: Noise model; 02 engineering and technology; SNR; 010501 environmental sciences; 01 natural sciences; Kernel principal component analysis; Signal theory (Telecommunications); Signal-to-noise ratio; Artificial Intelligence; 0202 electrical engineering, electronic engineering, information engineering; Heteroscedastic; 0105 earth and related environmental sciences; Mathematics; Noise (signal processing); Dimensionality reduction; Kernel methods; Signal classification; Support vector machine; Kernel method; Kernel (statistics); Functional analysis; Signal Processing; Feature extraction; 020201 artificial intelligence & image processing; Computer Vision and Pattern Recognition; Algorithm; Software; Image processing; Reproducing kernel Hilbert space; Causal inference

Some results about operators in nested Hilbert spaces

2005

With the use of interpolation methods we obtain some results about the domain of an operator acting on the nested Hilbert space {ℋ_f}_{f∈Σ} generated by a self-adjoint operator A, together with some estimates of the norms of its representatives. Some consequences in the particular case of a scale of Hilbert spaces are discussed.

Keywords: Operator Algebra; Pure mathematics; Hilbert manifold; Projective Limit; Nuclear operator; Hilbert R-tree; General Mathematics; Mathematical analysis; Hilbert's fourteenth problem; Hilbert space; Rigged Hilbert space; Compact operator on Hilbert space; Inductive Limit; Product Space; Reproducing kernel Hilbert space; Mathematics
Source: Rendiconti del Circolo Matematico di Palermo

Stability of the fixed point property in Hilbert spaces

2005

In this paper we prove that if X is a Banach space whose Banach–Mazur distance to a Hilbert space is less than √((5 + √17)/2), then X has the fixed point property for nonexpansive mappings.

Keywords: Pure mathematics; Isolated point; Hilbert manifold; Approximation property; Applied Mathematics; General Mathematics; Infinite-dimensional vector function; Mathematical analysis; Banach manifold; Rigged Hilbert space; Fixed-point property; Reproducing kernel Hilbert space; Mathematics
Source: Proceedings of the American Mathematical Society